Search resource list
H[1].26L中熵编码原理分析及改进
- H.26L entropy coding: principle analysis and improvement (PDF).
xxl
- Matlab implementations of the various entropies in information theory, including the computation of self-information, mutual information, conditional entropy, joint entropy, redundancy, and related quantities.
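The packages in this list are Matlab sources that are not reproduced here. Purely as an illustrative sketch of the quantities the entry names, the same calculations can be written in a few lines of Python (the function names and the plain-list distribution format are my own choices, not taken from any package above):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def joint_entropy(pxy):
    """H(X,Y) from a joint probability matrix given as a list of rows."""
    return entropy([p for row in pxy for p in row])

def conditional_entropy(pxy):
    """H(Y|X) = H(X,Y) - H(X), with H(X) taken from the row marginals."""
    px = [sum(row) for row in pxy]
    return joint_entropy(pxy) - entropy(px)

def redundancy(p):
    """Source redundancy 1 - H(X)/log2(n), relative to the uniform source."""
    return 1 - entropy(p) / math.log2(len(p))
```

For a fair coin, `entropy([0.5, 0.5])` is 1 bit and the redundancy is 0; a biased source has positive redundancy.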
AllShang
- Computes the various entropies of coding theory; the input values are variable.
shangzhi
- Information Theory and Coding course project: entropy calculation based on probability theory and coding.
AsymptoticEquipartitionProperty
- Lecture notes on error-free coding of discrete sources, from a course on the foundations of information theory. 4.1 Overview of source coding; 4.2 The asymptotic equipartition property of memoryless sources and the fixed-length coding theorem; 4.3 Variable-length coding of discrete memoryless sources; 4.4 Discrete stationary sources and their coding theorems; 4.5 The information entropy of Markov sources and …
theAntandtheGrasshopper0910
- Simulation of conditional entropy in information theory, implemented in Matlab.
cal_entropy256
- Reads a 256-level grayscale image and computes its information entropy; useful for information theory work.
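The entry's Matlab code is not shown here; as a minimal sketch of the same idea, the entropy of an 8-bit image is the Shannon entropy of its normalized 256-bin intensity histogram (the function name and the flat-pixel-list input are assumptions of this sketch, not the package's interface):

```python
import math

def image_entropy(pixels):
    """Information entropy in bits/pixel of an 8-bit grayscale image,
    given as any iterable of intensity values in 0..255."""
    hist = [0] * 256
    n = 0
    for v in pixels:
        hist[v] += 1
        n += 1
    # Entropy of the empirical intensity distribution; empty bins contribute 0.
    return -sum((c / n) * math.log2(c / n) for c in hist if c)
```

A constant image has entropy 0; an image split evenly between two gray levels has entropy 1 bit/pixel.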
athetic_computation
- In 1928, George D. Birkhoff proposed the famous aesthetic measure formula measure = order/complexity, but gave no concrete way to compute order and complexity. Building on Shannon's information theory, this paper proposes an aesthetic measure based on Shannon entropy and Kolmogorov complexity. The authors consider themselves to be "on a promising track with a sound theoretical basis, which not only extends but wil…
mi
- A Matlab toolkit of the main functions of mutual information theory, including routines for mutual information, entropy calculation formulas, and more.
zuidashang
- Analysis of the two-dimensional maximum-entropy algorithm for image processing; introduces the basic theory of maximum entropy.
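The entry above describes a two-dimensional maximum-entropy method whose code is not shown here. As a hedged illustration of the underlying idea, here is the simpler classical one-dimensional maximum-entropy (Kapur) threshold, which picks the gray level that maximizes the summed entropies of the foreground and background histograms (this is my own sketch, not the package's algorithm):

```python
import math

def kapur_threshold(hist):
    """One-dimensional maximum-entropy (Kapur) threshold for a 256-bin
    grayscale histogram: choose t maximizing H(background) + H(foreground)."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, float("-inf")
    for t in range(255):
        p0 = sum(p[: t + 1])          # mass of the class at or below t
        p1 = 1 - p0                   # mass of the class above t
        if p0 <= 0 or p1 <= 0:
            continue
        h0 = -sum(pi / p0 * math.log(pi / p0) for pi in p[: t + 1] if pi > 0)
        h1 = -sum(pi / p1 * math.log(pi / p1) for pi in p[t + 1:] if pi > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t
```

On a clearly bimodal histogram the maximizing threshold falls in the valley between the two modes.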
xinxishang
- Applies the concept of entropy from information theory to feature selection. Two entropy-based information measures for evaluating features are defined, error entropy and aliasing entropy, and the different physical meanings of the two definitions are explained; the interval partitioning problem, the key step in computing these entropies, is analyzed, and a better interval partitioning method is proposed.
zuixiaoshang
- For the multi-channel convolutive mixing model, a class of multi-channel blind deconvolution methods based on the information-theoretic minimum entropy criterion is derived from single-time-point observation samples. Unlike other methods, this approach takes into account the contextual information of the sources and the observed signals.
jpegdecode
- JPEG decoding analysis: explains in detail JPEG entropy decoding, the frame structure, and the principle and implementation of the IDCT, with an analysis of the code.
SUANSHU
- Starting from Shannon entropy theory and the principles of statistical coding, this paper gradually develops a discussion of research on, and applications of, data compression based on arithmetic coding: from the principles of arithmetic coding, the conditions under which it applies, and the purpose and significance of studying it, to the analysis and comparison of concrete arithmetic coding schemes and their implementation in C++.
huffmam_coding
- In computer science and information theory, Huffman coding is an entropy encoding algorithm used for lossless data compression.
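The entry's own source is not reproduced here; as a minimal, self-contained sketch of the technique it names, a Huffman code table can be built with a priority queue of subtree frequencies (function name and dict-based code table are choices of this sketch):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table {symbol: bitstring} from symbol frequencies;
    the canonical example of entropy encoding for lossless compression."""
    counts = Counter(text)
    if len(counts) == 1:               # degenerate single-symbol input
        return {next(iter(counts)): "0"}
    # Heap entries: (frequency, unique tie-breaker, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(counts.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]
```

The resulting code is prefix-free, and more frequent symbols get codewords no longer than rarer ones.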
shanghanshu
- Matlab source code for information entropy and average mutual information coding, from an information theory course.
Image_Feature_Selection_Method_Based_on_Immune_Enc
- For the two-class image pattern recognition problem of target versus background, and building on existing feature selection methods, a novel image feature selection method based on the encoding mechanism of immune molecules (IACA) is proposed. Borrowing the antibody-molecule encoding mechanism of the biological immune system, and with parameters estimated from the samples, an entropy measure of each individual feature's recognition sensitivity to target and background is proposed; inclusion and complementarity relations between features are studied and defined from a set-theoretic viewpoint; and, based on the principle of minimum binding energy among the amino acids composing an antibody molecule, construction rules for immune antibodies of image targets are proposed. This yields the algorithm IACA for finding an optimal feature subset, whose dimension is obtained automatically by the algorithm without manual setting; the selection result is the target's "imm…
information-theory
- Matlab implementation of the various entropies in information theory, including the computation of self-information, mutual information, conditional entropy, joint entropy, redundancy, and so on.
熵权法
- Calculates weights via the entropy weight method; written in Matlab and easy to use (the entropy theory is introduced to calculate the weight).
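The entry gives no code; a common formulation of the entropy weight method (one of several variants, sketched here in Python rather than the package's Matlab) normalizes each criterion column to a distribution over the samples, computes its normalized entropy, and weights each criterion by its divergence 1 − e:

```python
import math

def entropy_weights(matrix):
    """Entropy weight method for m samples x n criteria (all entries positive):
    a criterion whose values vary more across samples gets a larger weight."""
    m, n = len(matrix), len(matrix[0])
    k = 1 / math.log(m)                      # normalizes entropy to [0, 1]
    divergence = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        p = [v / s for v in col]             # column as a distribution
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergence.append(1 - e)             # degree of diversification
    total = sum(divergence)
    return [d / total for d in divergence]   # assumes at least one column varies
```

A criterion that is identical across all samples has normalized entropy 1 and hence weight 0; all the weight goes to criteria that discriminate between samples.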
mutualinformation
- An implementation of mutual information theory in Matlab. The code consists of three parts: Matlab code for entropy, joint entropy, and mutual information.
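The three parts the entry names fit together through the identity I(X;Y) = H(X) + H(Y) − H(X,Y). As a sketch of that decomposition only (in Python, with a list-of-rows joint matrix as an assumed input format, not the package's interface):

```python
import math

def _entropy(p):
    """Shannon entropy in bits of a flat probability list."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint probability matrix."""
    px = [sum(row) for row in pxy]           # marginal of X (rows)
    py = [sum(col) for col in zip(*pxy)]     # marginal of Y (columns)
    hxy = _entropy([p for row in pxy for p in row])
    return _entropy(px) + _entropy(py) - hxy
```

Independent variables give I = 0; two perfectly correlated fair bits give I = 1 bit.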